Cross-validation for the LASSO Kullback-Leibler divergence based regression
Description
Cross-validation for the LASSO Kullback-Leibler divergence based regression.
Arguments
y
A numerical matrix with compositional data, with or without zeros.
x
A matrix with the predictor variables.
alpha
The elastic net mixing parameter, with \(0 \leq \alpha \leq 1\). The penalty is defined as a weighted combination of the ridge and the LASSO penalties. When \(\alpha=1\) the LASSO is applied, while \(\alpha=0\) yields ridge regression.
nfolds
The number of folds for the K-fold cross-validation, set to 10 by default.
folds
If you already have a list with the folds, supply it here; otherwise leave it NULL and the folds will be created internally (a sketch of building such a list by hand appears after the argument descriptions).
seed
If seed is TRUE, the folds are created with a fixed seed, so the results will always be the same.
graph
If graph is TRUE (the default), a plot of the cross-validated object (deviance against \(\log \lambda\)) will appear.
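As an illustration of the folds argument, the short sketch below constructs such a list by hand. The assumed format, a list of index vectors with one element per fold, is an assumption made for this example rather than a statement taken from the package documentation.

## Minimal sketch of building a folds list manually
## (assumed format: one vector of row indices per fold).
n <- 150                          # number of rows in y and x
nfolds <- 10
set.seed(1234)                    # fixed seed for reproducible folds
shuffled <- sample(n)             # random permutation of the row indices
folds <- split(shuffled, rep_len(1:nfolds, n))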
Value
The outcome is the same as in the R package glmnet. The extra addition is that if graph = TRUE, the plot of the cross-validated object is also returned. It shows the deviance against the logarithm of \(\lambda\). The numbers on top of the figure give the number of sets of coefficients, one set per component, that are non-zero at each value of \(\lambda\).
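Since the outcome is stated to be the same as in glmnet, the sketch below shows how a comparable cross-validated object can be produced and read on simulated data. The use of glmnet::cv.glmnet with the multinomial family is an assumption made for illustration only, not a description of this function's internals.

library(glmnet)
## Simulated compositional responses (rows sum to 1) and predictors.
set.seed(1234)
n <- 100;  p <- 15
x <- matrix(rnorm(n * p), n, p)
y <- matrix(rexp(n * 4), n, 4)
y <- y / rowSums(y)
## Cross-validated LASSO (alpha = 1) with the multinomial deviance as the loss.
mod <- cv.glmnet(x, y, family = "multinomial", alpha = 1,
                 nfolds = 10, type.measure = "deviance")
plot(mod)         # deviance against log(lambda); counts of non-zero coefficients on top
mod$lambda.min    # value of lambda with the smallest cross-validated deviance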
Details
K-fold cross-validation is performed in order to select the optimal value of \(\lambda\), the penalty parameter of the LASSO.
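For concreteness, and up to the exact parameterisation used internally, the penalised objective being tuned can be sketched as below; the coefficient matrix \(B\) and fitted compositions \(p_{ij}(B)\) are notation introduced here for illustration only.
\[
\min_{B}\; \sum_{i=1}^{n}\sum_{j=1}^{D} y_{ij}\,\log\frac{y_{ij}}{p_{ij}(B)}
\;+\; \lambda \left[ \frac{1-\alpha}{2}\,\|B\|_2^2 + \alpha\,\|B\|_1 \right],
\qquad
p_{ij}(B) = \frac{\exp\!\left(\mathbf{x}_i^{\top}\boldsymbol{\beta}_j\right)}{\sum_{k=1}^{D}\exp\!\left(\mathbf{x}_i^{\top}\boldsymbol{\beta}_k\right)},
\]
and the K-fold cross-validation selects the \(\lambda\) whose average validation deviance is smallest.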
References
Friedman, J., Hastie, T. and Tibshirani, R. (2010) Regularization Paths for Generalized Linear Models via Coordinate Descent. Journal of Statistical Software, Vol. 33(1), 1-22.